Representing Statistical Information and Degrees of Belief in First-Order Probabilistic Conditional Logic
Author
Abstract
Employing maximum entropy methods in probabilistic conditional logic has proven to be a useful approach for commonsense reasoning. Yet the expressive power of this logic and similar formalisms is limited by their propositional foundations, and in recent years many proposals have been made for probabilistic reasoning in relational settings. Most of these proposals rely on extensions of traditional graph-based probabilistic models such as Bayes nets or Markov nets, whereas probabilistic conditional logic does not presuppose any graphical structure underlying the model to be represented. In this paper we lift maximum entropy methods to the relational case by using a first-order version of probabilistic conditional logic. Furthermore, we focus specifically on representing relational probabilistic knowledge by differentiating between different intuitions on relational probabilistic conditionals, namely between statistical interpretations and interpretations as degrees of belief. We develop a list of desirable properties for an inference procedure that supports these different interpretations and propose a specific inference procedure that fulfills these properties. We furthermore discuss related work and give some hints on future research.
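To make the idea of maximum-entropy completion concrete, here is a minimal propositional sketch (the paper itself works in the first-order setting). We assume a single conditional (flies | bird)[0.9] over two atoms, bird and flies, giving four possible worlds; within the bird-worlds the conditional fixes the 0.9 : 0.1 split, and by symmetry the two non-bird worlds receive equal mass, so entropy maximization reduces to a one-dimensional search over b = P(bird). All names and the search method are illustrative, not from the paper.

```python
import math

def entropy(b):
    # Distribution over the four worlds of {bird, flies}:
    # P(bird, flies) = 0.9*b and P(bird, ~flies) = 0.1*b enforce the
    # conditional (flies | bird)[0.9]; the two ~bird worlds, being
    # unconstrained, share the remaining mass (1 - b) equally.
    ps = [0.9 * b, 0.1 * b, (1 - b) / 2, (1 - b) / 2]
    return -sum(p * math.log(p) for p in ps if p > 0)

def max_entropy_bird():
    # Entropy is concave in b, so a simple ternary search finds the
    # P(bird) of the maximum-entropy model satisfying the conditional.
    lo, hi = 1e-9, 1 - 1e-9
    for _ in range(200):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if entropy(m1) < entropy(m2):
            lo = m1
        else:
            hi = m2
    return (lo + hi) / 2
```

The resulting maximum-entropy model assigns P(bird) ≈ 0.41 rather than 0.5: the conditional's asymmetric 0.9/0.1 split makes bird-worlds slightly less entropy-efficient than the uniform non-bird worlds, so the least biased completion shifts mass away from them.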
Similar Resources
Probabilistic Reasoning at Optimum Entropy with the MEcore System
Augmenting conditional logic with probabilities yields an expressive mechanism for representing uncertainty. The principle of optimum entropy allows one to reason in probabilistic logic in an information-theoretically optimal way by completing the given information as unbiasedly as possible. In this paper, we introduce the MEcore system, which realises the core functionalities for an intelligent agent ...
Lp: A Logic for Statistical Information
This extended abstract presents a logic, called Lp, that is capable of representing and reasoning with a wide variety of both qualitative and quantitative statistical information. The advantage of this logical formalism is that it offers a declarative representation of statistical knowledge; knowledge represented in this manner can be used for a variety of reasoning tasks. The logic differs fro...
Conditional probabilistic reasoning without conditional logic
Imaging is a class of non-Bayesian methods for the revision of probability density functions, originally proposed as a semantics for conditional logic. Two of these revision functions, Standard Imaging and General Imaging, have successfully been applied to modelling information retrieval (IR). Due to the problematic nature of a "direct" implementation of Imaging revision functions, we propose the...
Bisimulation and expressivity for conditional belief, degrees of belief, and safe belief
Plausibility models are Kripke models that agents use to reason about knowledge and belief, both of themselves and of each other. Such models are used to interpret the notions of conditional belief, degrees of belief, and safe belief. The logic of conditional belief contains that modality and also the knowledge modality, and similarly for the logic of degrees of belief and the logic of safe bel...
PhD Dissertation: Propositional Reasoning that Tracks Probabilistic Reasoning
Bayesians model one’s doxastic state by subjective probabilities. But in traditional epistemology, in logic-based artificial intelligence, and in everyday life, one’s doxastic state is usually expressed in a qualitative, binary way: either one accepts (believes) a proposition or one does not. What is the relationship between qualitative and probabilistic belief? I show that, besides the familia...